Description

Create an interactive mobile AR experience focusing on spatial awareness, tracking, anchoring, interaction, and environmental integration. The experience should be context-aware, augmenting a real or simulated physical object with an emphasis on reliable registration and meaningful user interaction on handheld devices.

Goals

  • Configure and develop an AR Foundation mobile project (iOS or Android).
  • Implement spatial awareness (planes, occlusion) and persistent augmentation through anchors or image tracking.
  • Design and enable user interaction with augmented content (touch-based).
  • Ensure visual consistency of virtual content with real-world lighting cues.
  • Handle degradation (e.g., tracking loss) gracefully.
  • Communicate design intent and robustness through documentation.

Requirements

Create a mobile AR application that augments a physical or simulated engineering object or environment relevant to XFactory. Required features:

  • Spatial Awareness & Occlusion: Detect planes (and/or use meshing if supported) and implement occlusion so virtual objects respect real geometry.
  • Persistent Anchoring or Image Tracking: Use either tracked images (reference image library) or anchor-on-tap so that augmentation persists stably across device movement and session continuity. Ideally, incorporate both. A minimal anchor-on-tap and touch-interaction sketch follows this list.
  • Augmented Content Interaction: Enable the user to interact via touch (e.g., toggle states, reveal metadata, reposition augmentation) with appropriate feedback.
  • Visual Consistency: Use environment probes or approximated lighting to make virtual content blend with the scene (e.g., basic light estimation, consistent shadows/highlights); see the light estimation sketch below.
  • Tracking Degradation Handling: Detect degraded tracking (e.g., a lost plane or anchor) and provide recovery cues or fallback behavior (e.g., a visual hint or re-localization prompt); see the recovery-hint sketch below.
  • Mobile Focus: Implementation must target mobile devices (no requirement for AR headsets). If device testing is not possible for every student, include instructions for using device simulators/emulators where feasible and clearly note any limitations.
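
The sketch below, referenced in the requirements above, is a minimal illustration rather than a required implementation. It assumes an XR Origin with ARRaycastManager and ARAnchorManager components in the scene, a hypothetical augmentationPrefab, and a hypothetical child object named MetadataPanel; exact API names vary across AR Foundation versions.

```csharp
using System.Collections.Generic;
using UnityEngine;
using UnityEngine.XR.ARFoundation;
using UnityEngine.XR.ARSubsystems;

// Minimal sketch: on the first tap, raycast against detected planes and
// anchor the augmentation at the hit pose; on later taps that hit a plane,
// toggle a metadata panel as a simple touch interaction. Assumes the
// legacy Input Manager is active and an ARAnchorManager is in the scene.
public class TapToAnchor : MonoBehaviour
{
    [SerializeField] ARRaycastManager raycastManager; // on the XR Origin
    [SerializeField] GameObject augmentationPrefab;   // hypothetical content prefab

    static readonly List<ARRaycastHit> hits = new List<ARRaycastHit>();
    GameObject spawned;

    void Update()
    {
        if (Input.touchCount == 0 || Input.GetTouch(0).phase != TouchPhase.Began)
            return;

        Vector2 touchPosition = Input.GetTouch(0).position;
        if (!raycastManager.Raycast(touchPosition, hits, TrackableType.PlaneWithinPolygon))
            return;

        Pose hitPose = hits[0].pose;
        if (spawned == null)
        {
            // Create an anchor at the hit pose and parent the content to it,
            // so the augmentation stays registered as tracking refines.
            var anchorObject = new GameObject("ContentAnchor");
            anchorObject.transform.SetPositionAndRotation(hitPose.position, hitPose.rotation);
            anchorObject.AddComponent<ARAnchor>();
            spawned = Instantiate(augmentationPrefab, anchorObject.transform);
        }
        else
        {
            // Simple interaction with feedback: toggle a hypothetical
            // "MetadataPanel" child on and off.
            var panel = spawned.transform.Find("MetadataPanel");
            if (panel != null)
                panel.gameObject.SetActive(!panel.gameObject.activeSelf);
        }
    }
}
```

For occlusion, little or no extra code is usually needed: add an AROcclusionManager to the AR camera and request environment depth in the inspector, where the device supports it.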

Prepare a brief Performance/Design Reflection Document (≤ 2 pages) describing at least three considerations or lightweight implementations related to AR usability or perceived performance (e.g., improving realism through occlusion, ensuring persistent anchoring across sessions, handling degraded tracking with clear recovery cues). These may be conceptual, but each must reference applied settings or design decisions (e.g., plane detection parameters, light estimation use, fallback interactions).
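
As one example of an applied setting to reflect on, the following hedged sketch drives the scene's main directional light from AR Foundation's per-frame light estimation. It assumes Light Estimation is enabled on the ARCameraManager and that mainLight is the scene's directional light; which estimation values actually arrive depends on platform and configuration, hence the null checks.

```csharp
using UnityEngine;
using UnityEngine.XR.ARFoundation;

// Minimal sketch: match the virtual main light to estimated real-world
// brightness, color temperature, and direction on each camera frame.
public class BasicLightEstimation : MonoBehaviour
{
    [SerializeField] ARCameraManager cameraManager; // on the AR camera
    [SerializeField] Light mainLight;               // scene directional light

    void OnEnable()
    {
        mainLight.useColorTemperature = true;
        cameraManager.frameReceived += OnFrameReceived;
    }

    void OnDisable() => cameraManager.frameReceived -= OnFrameReceived;

    void OnFrameReceived(ARCameraFrameEventArgs args)
    {
        var estimate = args.lightEstimation;
        if (estimate.averageBrightness.HasValue)
            mainLight.intensity = estimate.averageBrightness.Value;
        if (estimate.averageColorTemperature.HasValue)
            mainLight.colorTemperature = estimate.averageColorTemperature.Value;
        if (estimate.mainLightDirection.HasValue)
            mainLight.transform.rotation = Quaternion.LookRotation(estimate.mainLightDirection.Value);
    }
}
```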

At a minimum, include one persistent augmentation point (image or anchor), plane detection + occlusion, a touch interaction that changes the augmentation, and a degradation/recovery mechanism. Avoid overly elaborate content that distracts from interaction fidelity and robustness; focus on stability and user clarity.
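
For the degradation/recovery mechanism, a session-state watcher along the lines of the hedged sketch below is often sufficient; recoveryHintUI is a hypothetical on-screen prompt (e.g., "Move the device slowly to re-scan the area").

```csharp
using UnityEngine;
using UnityEngine.XR.ARFoundation;

// Minimal sketch: show a recovery hint whenever the AR session is not
// actively tracking, and hide it once tracking resumes.
public class TrackingRecoveryHint : MonoBehaviour
{
    [SerializeField] GameObject recoveryHintUI; // hypothetical UI prompt

    void OnEnable()  => ARSession.stateChanged += OnStateChanged;
    void OnDisable() => ARSession.stateChanged -= OnStateChanged;

    void OnStateChanged(ARSessionStateChangedEventArgs args)
    {
        bool tracking = args.state == ARSessionState.SessionTracking;
        recoveryHintUI.SetActive(!tracking);

        if (!tracking)
            Debug.Log($"Tracking degraded: {ARSession.notTrackingReason}");
    }
}
```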

Graduate Extension

Graduate students extend the core assignment by incorporating insights from recent research in mobile AR with engineering relevance.

  1. Targeted Literature Review:
    • Select 3–4 peer-reviewed papers from top-tier venues such as IEEE ISMAR, IEEE VR, ACM CHI, ACM UIST, ACM DIS, or similarly reputable XR/interaction conferences. Suggested topic areas: tracking and registration fidelity, multimodal input in AR (e.g., touch + contextual sensing), context-aware augmentation, industrial AR use cases (assembly guidance, maintenance, real-time monitoring), persistent spatial anchoring under drift, or user perception of AR consistency.
    • Summarize each paper (≤ 250 words), focusing on the engineering problem, the AR technique or insight, and how it informs stable, mobile AR system design.
  2. Research-Informed Enhancement Proposal:
    • Define one concrete enhancement to your AR experience inspired by the literature. Examples include a hybrid persistence mechanism combining image fiducials with anchor relocalization, context-aware content adjustment based on detected environmental cues (e.g., adapting overlay visibility under changing light or occlusion), or multimodal augmentation triggers combining touch with inferred context (e.g., proximity-based hints). A minimal sketch of the hybrid persistence idea follows this list.
    • Write a Research Memo Document (≤ 3 pages) that connects selected literature to the enhancement idea, describes the design and intended integration (conceptual or partially implemented), and discusses expected robustness benefits and limitations in an engineering deployment.
  3. Optional Mini Evaluation:
    • (Encouraged) Perform a simple robustness test (peer or self-run) under at least two disturbance conditions (e.g., partial occlusion, lighting change, device movement) and log the augmentation persistence or recovery success.
    • Include a short reflection in your Research Memo Document on what worked, what failed, and how literature insights could mitigate issues.
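
As a purely conceptual illustration of the hybrid persistence example above, the hedged sketch below keeps content on a world anchor but snaps its pose back to a reference image whenever that image is re-tracked, counteracting drift. It assumes an ARTrackedImageManager with a configured reference image library; the trackedImagesChanged event shown is the AR Foundation 4/5 API and differs in newer versions.

```csharp
using UnityEngine;
using UnityEngine.XR.ARFoundation;
using UnityEngine.XR.ARSubsystems;

// Conceptual sketch: use a tracked image as a drift-correcting fiducial
// for content that is otherwise held by a world anchor.
public class HybridPersistence : MonoBehaviour
{
    [SerializeField] ARTrackedImageManager imageManager;
    [SerializeField] Transform content; // root of the anchored augmentation

    void OnEnable()  => imageManager.trackedImagesChanged += OnTrackedImagesChanged;
    void OnDisable() => imageManager.trackedImagesChanged -= OnTrackedImagesChanged;

    void OnTrackedImagesChanged(ARTrackedImagesChangedEventArgs args)
    {
        foreach (var image in args.updated)
        {
            // Only trust the pose while the image is actively tracked.
            if (image.trackingState == TrackingState.Tracking)
                content.SetPositionAndRotation(image.transform.position,
                                               image.transform.rotation);
        }
    }
}
```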

Submission

Deliverables

  • UnityPackage file (.unitypackage) containing your AR Foundation mobile project work. Export only your personal folder created inside Assets/ and named with your full name (e.g., Assets/FirstName_LastName/).
  • A README (PDF) with setup/deployment instructions for mobile testing, a description of the chosen augmentation target, and how to exercise interactions.
  • A Performance/Design Reflection Document (PDF) as described under Requirements.
  • Evidence of spatial awareness + occlusion, persistent augmentation, interaction, and degradation handling (screenshots, a short demo video, or log).
  • (Graduate students only) Literature summaries, research memo, and optional evaluation reflection/data.

Guidelines

  • Submissions must be made individually.
  • Submit your .unitypackage file, README, Performance/Design Reflection Document, and (if applicable) Research Memo via Canvas.
  • To create your submission:
    1. In the Unity Project window, right-click your personal folder (Assets/FirstName_LastName).
    2. Select Export Package….
    3. Check Include Dependencies (to ensure prefabs, scripts, and scenes are included).
    4. Name your file AE_FirstName_LastName.unitypackage.
  • Double-check your UnityPackage by importing it into a fresh copy of the base project to confirm nothing is missing.
  • Your README must include:
    • Device platform target (Android/iOS).
    • Steps to reproduce the AR experience.
    • Description of interaction and persistence mechanisms.
    • (For graduate students) Full citations of literature sources in standard reference format.
  • Provide any reference images or assets required to reproduce image tracking targets.
  • Clearly note any limitations if full mobile deployment/testing was not possible (e.g., due to lack of hardware access).
  • Filename conventions:
    • AE_FirstName_LastName.unitypackage.
    • README_FirstName_LastName.pdf.
    • PerformanceReflection_FirstName_LastName.pdf.
    • ResearchMemo_FirstName_LastName.pdf (graduate students only).

Grading Rubric

Undergraduate Core (100 points)

  • Spatial Awareness & Occlusion (25 points): Plane detection and occlusion implemented reliably; virtual content respects real geometry.
  • Explanation of Spatial Data Use (10 points): Clarity on how spatial data informs augmentation behavior.
  • Persistent Anchoring / Image Tracking (20 points): Stable augmentation across movement; lifecycle handled; fallback mechanism.
  • Interaction & Feedback (15 points): Touch-based interaction works; responsive feedback provided.
  • Visual Consistency & Contextual Integration (15 points): Lighting approximation/environment blending makes augmentation sensible.
  • Degradation Handling (10 points): Detects tracking loss; user guided to recover gracefully.
  • Documentation & Reproducibility (10 points): Clear README, reproduction steps, required assets provided.
  • Scope & Coherence (5 points): Focused augmentation; avoids unnecessary complexity.

Graduate Extension Add-On (up to 25 points)

  • Literature Review Quality (8 points): Well-chosen papers from allowed venues; summaries tied to mobile AR engineering.
  • Enhancement Framing & Insight (9 points): Strong linkage from literature to proposed enhancement; thoughtful design proposal.
  • Optional Evaluation Reflection (8 points): Evidence or thoughtful analysis of robustness tests; insights tied back to literature.